MiDGaP: Mixture Density Gaussian Processes
Author
Abstract
Gaussian Processes (GPs) have become a core technique in machine learning over the last decade, with numerous extensions and applications. Although several approaches exist for warping the conditional Gaussian posterior distribution to other members of the exponential family, most tacitly assume a unimodal posterior. In this paper we present a mixture density model (MDM) allowing multi-modal posterior distributions from GPs. We make explicit comparisons with alternative models, namely the Mixture Density Network (MDN) and Mixture of GP Experts (GPE). Unlike MDN approaches, we allow full probability distributions over the latent variables that encode the mixture posterior, allowing uncertainty to propagate in a principled manner. Unlike the GPE methods, we achieve non-Gaussian posteriors within a single GP model. We showcase the performance of the approach on synthetic and real time-series data sets. Our results indicate that the approach is not only competitive in terms of error metrics but also provides further insight into the multiplicity of potential paths a time series may take in the future.
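The abstract alone does not specify the MiDGaP construction, but the kind of multi-modal predictive distribution it describes can be illustrated with a generic sketch: two independent GP regressors are fit to the two branches of a bimodal data set and combined into a Gaussian-mixture predictive density. All function names, hyperparameters, and the fixed 50/50 mixture weights below are illustrative assumptions, not the authors' model (which instead places full distributions over latent mixture variables).

```python
import numpy as np

def rbf(a, b, ls=1.0, var=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x, y, xs, noise=1e-2):
    """Standard GP regression posterior mean and variance at test inputs xs."""
    K = rbf(x, x) + noise * np.eye(len(x))
    Ks = rbf(x, xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(xs, xs)) - np.sum(v * v, axis=0)
    return mean, var

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 40)
# The same inputs map to two distinct output branches -> bimodal targets.
y1 = np.sin(x) + 0.05 * rng.standard_normal(40)
y2 = np.sin(x) + 2.0 + 0.05 * rng.standard_normal(40)

xs = np.linspace(0, 5, 100)
mu1, v1 = gp_posterior(x, y1, xs)
mu2, v2 = gp_posterior(x, y2, xs)

def mixture_pdf(y, means, variances, weights):
    """Gaussian-mixture density at scalar y, evaluated per test input."""
    p = np.zeros_like(means[0])
    for m, v, w in zip(means, variances, weights):
        p += w * np.exp(-0.5 * (y - m) ** 2 / v) / np.sqrt(2 * np.pi * v)
    return p

# Bimodal predictive density at y = 1.0 for every test input
# (fixed 50/50 weights here purely for illustration).
density = mixture_pdf(1.0, [mu1, mu2], [v1 + 1e-2, v2 + 1e-2], [0.5, 0.5])
```

A unimodal GP fit to the pooled branches would average them; the mixture keeps both modes visible in the predictive density.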
Similar resources
On-line statistical monitoring of batch processes using Gaussian mixture model
The statistical monitoring of batch manufacturing processes is considered. It is known that conventional monitoring approaches, e.g. principal component analysis (PCA), are not applicable when the normal operating conditions of the process cannot be sufficiently represented by a Gaussian distribution. To address this issue, a Gaussian mixture model (GMM) has been proposed to estimate the probabil...
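The monitoring idea in this snippet can be sketched end-to-end: fit a small Gaussian mixture to in-control data by EM, set a control limit on the log-likelihood, and flag samples that fall below it. The two-regime data, component count, and 95% control limit below are illustrative choices, not the cited paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
# In-control data drawn from two operating regimes: bimodal, so a single
# Gaussian (as in plain PCA-style monitoring) would describe it poorly.
train = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(6.0, 1.0, 500)])

def fit_gmm(x, K=2, iters=100):
    """Basic EM for a K-component 1-D Gaussian mixture."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))
    var = np.full(K, x.var())
    w = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted parameter updates.
        n = r.sum(axis=0)
        w, mu = n / len(x), (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return w, mu, var

def log_lik(x, w, mu, var):
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1))

w, mu, var = fit_gmm(train)
limit = np.quantile(log_lik(train, w, mu, var), 0.05)  # 95% control limit

# A sample between the two regimes is unlikely under the mixture and is
# flagged, even though a single Gaussian fit to this data would accept it
# (its mean sits at 3.0, right where this sample lies).
alarm = log_lik(np.array([3.0]), w, mu, var) < limit
```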
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms have been applied to the Radial Basis Function Neural Network (RBFNN) to approximate functions of high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns various strategies to optimize the procedure of Gradient ...
Gaussian Process Covariance Kernels for Pattern Discovery and Extrapolation
Gaussian processes are rich distributions over functions, which provide a Bayesian nonparametric approach to smoothing and interpolation. We introduce simple closed form kernels that can be used with Gaussian processes to discover patterns and enable extrapolation. These kernels are derived by modelling a spectral density – the Fourier transform of a kernel – with a Gaussian mixture. The propos...
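The construction sketched in this snippet has a well-known closed form in one dimension: a Gaussian bump of weight w placed at frequency mu with variance v in the spectral density transforms back, via the inverse Fourier transform, to the kernel term w * exp(-2 * pi^2 * tau^2 * v) * cos(2 * pi * tau * mu). A minimal version, with purely illustrative hyperparameters, looks like:

```python
import numpy as np

def spectral_mixture_kernel(x1, x2, weights, means, variances):
    """1-D spectral mixture kernel: each term is the Fourier dual of a
    Gaussian bump (and its mirror image) at frequency mu in the spectrum."""
    tau = x1[:, None] - x2[None, :]  # pairwise input distances
    K = np.zeros_like(tau, dtype=float)
    for w, mu, v in zip(weights, means, variances):
        K += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
    return K

x = np.linspace(0.0, 4.0, 50)
# Two spectral components: one fast oscillation, one slow trend-like term.
K = spectral_mixture_kernel(x, x,
                            weights=[1.0, 0.5],
                            means=[1.0, 0.25],
                            variances=[0.05, 0.01])
```

The cosine terms let a GP with this kernel extrapolate periodic structure, which a squared-exponential kernel cannot do.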
Parameter estimation for autoregressive Gaussian-mixture processes: the EMAX algorithm
The problem of estimating parameters of discrete-time non-Gaussian autoregressive (AR) processes is addressed. The subclass of such processes considered is restricted to those whose driving noise samples are statistically independent and identically distributed according to a Gaussian-mixture probability density function (pdf). Because the likelihood function for this problem is typically unbou...
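The model class in this snippet is concrete enough to simulate: an AR process whose driving noise is drawn i.i.d. from a two-component Gaussian mixture (frequent small shocks plus occasional large ones). The sketch below is not the EMAX algorithm; it only generates such a process and recovers the AR coefficient with a plain least-squares baseline, the kind of estimator an EM-based method would be compared against. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, a = 5000, 0.8  # series length and true AR(1) coefficient

# Gaussian-mixture innovations: 90% small shocks, 10% large outliers,
# making the driving noise heavy-tailed and non-Gaussian.
small = rng.random(n) < 0.9
e = np.where(small, rng.normal(0.0, 1.0, n), rng.normal(0.0, 5.0, n))

# Simulate the AR(1) recursion x[t] = a * x[t-1] + e[t].
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + e[t]

# Least-squares estimate of the AR coefficient; consistent here because
# the mixture noise still has finite variance, though an EM approach can
# exploit the mixture structure for better efficiency.
a_hat = float(x[1:] @ x[:-1]) / float(x[:-1] @ x[:-1])
```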
Published: 2018